# Semantic similarity

**Silma Embedding Matryoshka V0.1** (silma-ai, Apache-2.0) · 446 downloads · 11 likes
SILMA Arabic Matryoshka Embedding Model 0.1 is an Arabic text embedding model trained with the Matryoshka technique, so its vectors remain usable when truncated to smaller dimensions, letting users trade off speed, storage, and accuracy.
Tags: Text Embedding, Multilingual
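The Matryoshka idea above can be sketched without the model itself: an embedding trained this way keeps its most important information in the leading coordinates, so a prefix of the vector, re-normalized, is still a usable embedding. A minimal pure-Python illustration with made-up 8-dimensional vectors (a real model like this one would emit far more dimensions):

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def cosine(a, b):
    # assumes both vectors are already unit-length
    return sum(x * y for x, y in zip(a, b))

def truncate(v, dim):
    """Matryoshka trick: keep the first `dim` coordinates, then re-normalize."""
    return normalize(v[:dim])

# toy "embeddings" standing in for real model output
a = normalize([0.9, 0.1, 0.3, 0.02, 0.01, 0.0, 0.01, 0.0])
b = normalize([0.8, 0.2, 0.25, 0.01, 0.0, 0.02, 0.0, 0.01])

full = cosine(a, b)                              # similarity at full dimension
small = cosine(truncate(a, 4), truncate(b, 4))   # similarity at half dimension
print(full, small)
```

At query time this lets you store and compare, say, 256 of 768 coordinates and pay only a small accuracy cost.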
**Kf Deberta Multitask** (upskyy) · 1,866 downloads · 15 likes
A Korean sentence embedding model based on sentence-transformers, capable of mapping sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as clustering or semantic search.
Tags: Text Embedding, Transformers, Korean
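Most of the models on this page are used the same way: embed a query and a corpus into the dense vector space, then rank the corpus by cosine similarity. A minimal sketch with hand-made 3-dimensional stand-in vectors (a real model would produce 768 dimensions, but the ranking step is identical):

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# stand-in embeddings; a real encoder would map each sentence to a vector
corpus = {
    "how to cluster documents": [0.1, 0.9, 0.2],
    "korean restaurant reviews": [0.8, 0.1, 0.3],
    "grouping similar texts": [0.3, 0.7, 0.4],
}
query_vec = [0.15, 0.88, 0.18]  # pretend embedding of "document clustering"

# semantic search: sort the corpus by similarity to the query
ranked = sorted(corpus, key=lambda s: cosine(query_vec, corpus[s]), reverse=True)
print(ranked)
```

In practice you would keep only the top-k results and use an approximate nearest-neighbor index once the corpus grows large.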
**Simcse Model Phayathaibert** (kornwtp, Apache-2.0) · 123 downloads · 2 likes
A sentence-transformers based model that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as clustering or semantic search.
Tags: Text Embedding, Transformers
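SimCSE-style models like this one are trained with an in-batch contrastive (InfoNCE) objective: two dropout-noised encodings of the same sentence form a positive pair, and the other sentences in the batch serve as negatives. A toy pure-Python sketch of that loss (the vectors are made up, and `temperature=0.05` is assumed here as a typical SimCSE-style default):

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def info_nce(pairs, temperature=0.05):
    """Average in-batch contrastive loss over (anchor, positive) pairs."""
    loss = 0.0
    for i, (anchor, _) in enumerate(pairs):
        # similarity of this anchor to every positive in the batch
        sims = [math.exp(cosine(anchor, pos) / temperature) for _, pos in pairs]
        loss += -math.log(sims[i] / sum(sims))
    return loss / len(pairs)

# a good encoder: each anchor is close to its own positive only
good = [([1.0, 0.0], [0.9, 0.1]), ([0.0, 1.0], [0.1, 0.9])]
# a degenerate encoder: both anchors are equally close to both positives
bad = [([1.0, 0.0], [0.5, 0.5]), ([0.0, 1.0], [0.5, 0.5])]
print(info_nce(good), info_nce(bad))
```

Minimizing this loss pulls each sentence toward its own augmented view and pushes it away from the rest of the batch, which is what shapes the similarity structure of the embedding space.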
**Sentence Transformers Gte Base** (embaas) · 43 downloads · 0 likes
A sentence embedding model based on sentence-transformers, capable of mapping sentences and paragraphs into a 768-dimensional vector space, suitable for tasks such as semantic search and clustering.
Tags: Text Embedding
**Test Food** (Linus4Lyf) · 42 downloads · 0 likes
A sentence-transformers based model that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks like sentence similarity computation and semantic search.
Tags: Text Embedding, Transformers
**Parameter Mini Lds** (nategro) · 14 downloads · 0 likes
A patent-parameter sentence recognition model based on all-MiniLM-L6-v2 that maps sentences and paragraphs into a 384-dimensional dense vector space, suitable for tasks like clustering or semantic search.
Tags: Text Embedding
**Roberta Ko Small Tsdae** (smartmind, MIT) · 39 downloads · 2 likes
A small Korean RoBERTa model based on sentence-transformers, capable of mapping sentences and paragraphs into a 256-dimensional dense vector space, suitable for tasks such as clustering or semantic search.
Tags: Text Embedding, Transformers, Korean
**Bertimbaulaw Base Portuguese Cased** (alfaneo, MIT) · 47 downloads · 1 like
A fine-tuned version of the Portuguese BERT base model (neuralmind/bert-base-portuguese-cased); the downstream task it was tuned for is not specified.
Tags: Large Language Model, Transformers
**Bpr Gpl Bioasq Base Msmarco Distilbert Tas B** (income) · 41 downloads · 0 likes
A sentence similarity model based on sentence-transformers that maps sentences and paragraphs to a 768-dimensional dense vector space, suitable for tasks such as semantic search and clustering.
Tags: Text Embedding, Transformers
**Nfcorpus Msmarco Distilbert Gpl** (GPL) · 439 downloads · 0 likes
A sentence-transformers based model that maps sentences and paragraphs to a 768-dimensional dense vector space, suitable for tasks like clustering or semantic search.
Tags: Text Embedding, Transformers
**Model Paraphrase Multilingual MiniLM L12 V2 100 Epochs** (jfarray) · 13 downloads · 0 likes
A sentence-transformers based model that maps sentences and paragraphs to a 384-dimensional dense vector space, suitable for tasks such as sentence similarity calculation and semantic search.
Tags: Text Embedding, Transformers
**Ko Sroberta Multitask** (jhgan) · 162.23k downloads · 115 likes
A Korean sentence embedding model based on sentence-transformers, capable of mapping sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as clustering or semantic search.
Tags: Text Embedding, Korean
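Clustering, the other task these descriptions mention, groups embeddings by proximity in the vector space; k-means is the usual first choice. A pure-Python sketch on toy 2-dimensional stand-ins for sentence embeddings (real vectors would have 768 coordinates, but Lloyd's algorithm is identical):

```python
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans(points, centers, steps=10):
    """A few Lloyd iterations: assign each point to its nearest center, then recenter."""
    for _ in range(steps):
        groups = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)), key=lambda j: dist(p, centers[j]))
            groups[i].append(p)
        # new center = coordinate-wise mean of its group (keep old center if empty)
        centers = [
            [sum(c) / len(g) for c in zip(*g)] if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers, groups

# two visibly separated "topics" in embedding space
points = [[0.1, 0.2], [0.2, 0.1], [0.15, 0.15],
          [0.9, 0.8], [0.8, 0.9], [0.85, 0.85]]
centers, groups = kmeans(points, centers=[[0.0, 0.0], [1.0, 1.0]])
print(centers)
```

With real sentence embeddings you would normally use a library implementation and pick the number of clusters by inspection or a criterion such as silhouette score.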
**Bert Retriever Squad2** (pinecone) · 36 downloads · 0 likes
A sentence embedding model based on sentence-transformers that converts text into 768-dimensional vector representations, suitable for tasks such as semantic similarity and text clustering.
Tags: Text Embedding, Transformers
**Roberta Base Wechsel Chinese** (benjamin, MIT) · 16 downloads · 2 likes
A Chinese RoBERTa model trained with the WECHSEL method, which achieves efficient cross-lingual transfer from English to Chinese.
Tags: Large Language Model, Transformers, Chinese
**Ko Sbert Multitask** (jhgan) · 7,030 downloads · 17 likes
A Korean sentence embedding model based on sentence-transformers, capable of mapping sentences and paragraphs into a 768-dimensional dense vector space.
Tags: Text Embedding
**Simcse Model Roberta Base Thai** (mrp) · 69 downloads · 2 likes
A sentence-transformers model based on XLM-R, specifically optimized for Thai, that maps sentences and paragraphs into a 768-dimensional dense vector space.
Tags: Text Embedding, Transformers
**Model Dccuchile Bert Base Spanish Wwm Uncased 1 Epochs** (jfarray) · 8 downloads · 0 likes
A sentence embedding model based on sentence-transformers that maps text into a 256-dimensional vector space, suitable for semantic search and clustering tasks.
Tags: Text Embedding
**Sentencetransformer Bert Hinglish Small** (aditeyabaral) · 20 downloads · 0 likes
A small BERT-based Hinglish (Hindi-English code-mixed) sentence-transformer model that maps text into a 768-dimensional vector space.
Tags: Text Embedding, Transformers